The following rules apply to sensor systems, whether equipped on a synthmorph, bot, vehicle, habitat, or device. Biological equivalents also follow suit.
=Sensor Types= 
Sensors can be broken down into multiple categories.
==Technological vs Biological== 
Unless otherwise noted, the capabilities of biological sensor organs are equivalent to those of technological devices. Tech sensors are typically fully meshed and have an on-board sensor or device AI, whereas biosense input into the brain or cyberbrain can be accessed via mesh inserts and other implants, shared via tacnet, recorded for lifelog purposes, and so on. Tech sensors are powered by small nuclear batteries, standard batteries, or locally broadcast wireless power feeds. Hyperspectral tech sensors can turn off or isolate certain frequencies, whereas biological sense organs perceive the entire spectrum of wavelengths (unless equipped with a sense filter implant).
==Active vs. Passive== 
Active sensors must emit signals, taking a reading based on the signals that are reflected back their way. This means, however, that active sensors can be easily detected by any other senses or sensors capable of detecting their signals (see Detecting Sensors). Passive sensors, by contrast, do not emit any signals that might give them away. They simply receive any signals that come their way, whether natural or artificial. Any active sensor can be operated in passive mode, though it will then only detect other active scanners of that type and any natural emissions.
==Scanners vs. Software== 
A distinction must be made between the scanning device that actually receives input and the software that analyzes it. Most sensors come equipped with analysis software overseen by the device’s AI. A standard video camera, for example, records its visual input but does not detect anything on its own. Such a camera can be loaded with all sorts of software such as facial recognition, silhouette analysis, and so forth. Many sensor devices run this software on their own electronics; sensors that are part of a larger security network, however, usually feed their data to a central processing node where security and sensor AIs oversee the data collected. Software systems can correlate data from multiple devices, acquiring a better picture of the situation than they could from a single sensor.
=Sensor Size and Range= 
Sensors come in a range of sizes, just like other gear (see the Gear Sizes table). Standard portable sensors are small hand-held and easily concealable devices. Most can also be obtained in mini or micro sizes at the same cost (if the gamemaster allows). At the gamemaster’s discretion, some sensors may not be available at certain sizes. Radar, for example, does not come in a size smaller than medium, simply because the wavelengths radar operates on are “physically” large and so require a big enough sensor to read them. Nanoswarms carry nano versions of various sensors. A sensor’s size determines its effective range, as noted on the Radio and Sensor Ranges table.
Many common, everyday objects contain sensors. Spimes can be found in habitat infrastructure, public seating, lighting fixtures, appliances, furniture, retail displays, entranceways, rooftops, transit stops, and randomly embedded or stuck to available surfaces in public areas. Usually meshed and publicly accessible, they most commonly carry micro-sized flat cameras and sometimes other micro, mini, or small sensors, especially microphones or infrared.
=Radio and Sensor Ranges Table= 
|| Size Category || Urban Range || Open Range || Examples ||
|| Nano || 20 meters || 100 meters || Smart Dust, Nanobot/Microbot Swarms ||
|| Micro || 50 meters || 500 meters || Microbugs ||
|| Mini || 1 kilometer || 20 kilometers || Mesh Inserts ||
|| Small || 5 kilometers || 50 kilometers || Ectos, Miniature Radio Farcasters, Portable Sensors ||
|| Medium || 25 kilometers || 250 kilometers || Radio Boosters, Vehicle Sensors ||
|| Large || 500 kilometers || 5,000 kilometers ||   ||
=Detecting Sensors= 
The ubiquity of sensors in public areas can make it a challenge to find all of them. This is especially true when you also take into consideration sensors and spimes carried by moving people, bots, and vehicles, as well as those embedded in infrastructure.
Visually spotting sensors can be hard given their often tiny size. Apply a −10 modifier to detect small sensors, −20 to detect mini sensors, and −30 to detect micro sensors. An additional modifier may apply for sensors that are actively concealed. Nanoswarm sensors cannot be spotted without nanoscopic vision, a nanodetector, or a nanoswarm of your own. If a character is attempting to visually spot all of the sensors in an area, treat it as a Task Action with a timeframe appropriate to the sensor coverage.
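The spotting modifiers above can be summarized in a short sketch. This is purely an illustration of the arithmetic; the function name and structure are assumptions, not official rules material:

```python
# Perception modifiers for visually spotting sensors by size category,
# per the rules above. Concealment penalties stack on top.
SIZE_MODIFIERS = {
    "small": -10,
    "mini": -20,
    "micro": -30,
}

def spot_modifier(size: str, concealment: int = 0) -> int:
    """Total Perception modifier to visually spot a sensor.

    `concealment` is any additional penalty (zero or negative) for a
    sensor that is actively concealed.
    """
    return SIZE_MODIFIERS[size] + concealment

# An actively concealed mini sensor at an extra -10:
# spot_modifier("mini", -10) → -30
```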
Active sensors can be detected by virtue of the signals they transmit. A passive radar sensor can be used to detect an actively scanning radar device, for example, or enhanced vision could be used to detect an infrared laser or a terahertz emitter’s signals.
Public meshed sensors freely give away their locations. Identifying these and their placement is an automatic affair for anyone with mesh inserts. The wireless signals of non-public sensors are also easy to scan, unless they are stealthed, in which case an Interfacing Test at −30 must be made to detect them (see Wireless Scanning). Once the signals are found, they can be triangulated using readings from any three receiving devices (which almost every character is likely to have on their person) and a successful Interfacing Test to within a range of 50 meters, −10 meters per 10 points of MoS. Several tools can also assist in detecting sensors, particularly the lens spotter, electrical sense implants, and smart dust.
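The triangulation accuracy above (50 meters, reduced by 10 meters per 10 full points of MoS) can be sketched as follows. Clamping the radius at zero is an assumption on my part; the rules do not state a minimum:

```python
def triangulation_radius(mos: int) -> int:
    """Radius in meters to which a hidden sensor's signal is located
    on a successful Interfacing Test.

    Base 50 m, minus 10 m per full 10 points of MoS. The floor of 0 m
    is an assumption, not stated in the rules.
    """
    return max(0, 50 - 10 * (mos // 10))

# MoS 25 narrows the search to a 30-meter radius:
# triangulation_radius(25) → 30
```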
=Using Sensors= 
In most circumstances, sensors can simply be assumed to work as advertised, no test necessary. Cameras record pictures, t-rays see through walls, x-rays detect implants, facial recognition IDs people, and so on. In circumstances where a sensor's operation is impaired or where someone is actively trying to circumvent or fool the sensor, a test is required. In this case, the Perception of the sensor’s operator or its built-in AI is rolled. Treat this as a Success Test when operating the sensor under difficulty or as an Opposed Test against someone trying to evade or trick the sensor. Various modifiers may apply to this test, depending on the sensor and situation; see the Perception and Sensor Test Modifiers table for some suggestions.
In circumstances where the sensor is guaranteed to work, but the quality of the sensor input is in question, a Simple Success Test using Interfacing skill can be rolled to gauge how well the sensor functions. For example, if a well-situated camera is trying to grab a shot of someone in an open but crowded square, an Interfacing Test by the operator or AI will determine if the picture is clear (success) or fuzzy (failure), with MoS or MoF determining the quality.
Sensor software relies on different skills to analyze different types of sensor data. Pattern-matching algorithms use the AI’s Perception, sometimes with modifiers based on the size and quality of the database they are matching against. Other software may rely on various Knowledge skills, as noted. Chem sniffers, for example, rely on the AI or software’s [[Academics]]: Chemistry skill.
||||~ Perception and Sensor Test Modifiers ||
|| **Situation** || **Modifier** ||
|| Sensor of inferior quality || -10 to -30 ||
|| Sensor of superior quality || +10 to +30 ||
|| Minor sensor impairment (standard vision: glare, light smoke, dim light; target: minor cover) || -10 ||
|| Moderate sensor impairment (standard vision: heavy smoke, dark; target: moderate cover; radar: biomorphs, small targets) || -20 ||
|| Major sensor impairment (standard vision: impenetrable fog; target: major cover) || -30 ||
|| Range || -10 to -30 ||
|| Small target (child sized) || -10 ||
|| Very small target (mouse sized) || -30 ||
|| Large target (car sized) || +10 ||
|| Very large target (barn sized) || +30 ||
|| Hyperspectral sensors || +20 ||
|| Multiple sensors || +10 to +30 ||
|| Detailed analysis || +10 to +30 ||

=[[#Hyperspectral Sensors]]Hyperspectral Sensors= 
Also referred to as multispectral, full-spectral, and ultraspectral sensors, hyperspectral sensors receive input from multiple wavelength bands simultaneously and then compare the results against each other. The analysis of the input at different frequencies helps to identify features. Many objects and materials have unique spectral signatures across the electromagnetic spectrum, enabling easier identification when viewed with multiple frequencies at once rather than in just one band.
In game terms, hyperspectral sensors apply a +20 modifier to Perception Tests. Hyperspectral sensors are also effective against camouflage tricks and other circumstances that only impair perception at certain frequencies. Reduce the modifiers for these effects by half (for example, heavy smoke that impairs visibility by −20 is only −10 against a hyperspectral imager).
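The net effect of the halving rule combined with the +20 sensor bonus can be sketched as below. Rounding toward zero when halving an odd modifier is my assumption; the rules do not specify:

```python
def hyperspectral_modifier(base_bonus=20, impairments=()):
    """Net Perception modifier for a hyperspectral sensor.

    Frequency-specific impairments (negative values) are halved per the
    rule above; the +20 hyperspectral bonus always applies. Rounding
    toward zero via int() is an assumption.
    """
    return base_bonus + sum(int(m / 2) for m in impairments)

# Heavy smoke (-20) against a hyperspectral imager nets +10 overall:
# hyperspectral_modifier(impairments=[-20]) → 10
```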
Enhanced vision and enhanced hearing bioware can both be considered hyperspectral.
=Using Multiple Sensors= 
Sometimes simply having multiple perspectives on something can aid perception—this is one of the primary benefits of tacnets. At the gamemaster’s discretion, analyzing the input from multiple sensor feeds may provide a +10 to +30 modifier to Perception Tests.
||||||||~ Comprehensive Sensor Chart
//This chart details most, if not all, of the sensor systems available in Eclipse Phase// ||
|| **Sensor** || **Skill** || **Type** || **Notes** ||
|| Chem Sniffer || Academics: Chemistry || Passive || Analyzes air; detects chemicals, explosives, firearms, exhaled carbon dioxide, body odor biometrics ||
|| Electrical Sense || Perception || Passive || Detects nearby electronics, electrical fields (including those of biomorph bodies) ||
|| Face/Image/Pattern/Biometric Recognition Software || Perception || Passive || Detects faces, gaits, images, irises, personal body odor, retinas, silhouettes, skeletons, sounds, voices, or other specified patterns. ||
|| Ghost Imager || Perception || Passive || Ignores all visual modifiers under special conditions ||
|| Hyperspectral Audio || Perception || Passive || Hears all audio frequencies, infrasound through ultrasound. ||
|| Hyperspectral Imager (Passive) || Perception || Passive || Perceives infrared through ultraviolet ||
|| Hyperspectral Imager (Active) || Perception || Active || Perceives radar through ultraviolet ||
|| Infrared ||   ||   ||   ||
|| -Near infrared || Perception || Passive || Night vision, active infrared lighting applies bonus modifiers ||
|| -Mid-long infrared || Perception || Passive || Detects heat and temperature differences ||
|| -Chemical Imaging || Academics: Chemistry || Passive || Detects organic chemical composition via vibrational spectroscopy ||
|| Infrasound/Lateral Line || Perception || Passive || Detects vibrations, seismic events, footprints, heart beats, movement; long range ||
|| Kinesics Software || Kinesics || Passive || Measures stress, hostile intent, truthfulness, emotional state ||
|| LIDAR || Perception || Active || Measures speed, movement, range; useful for minute positional mapping ||
|| -Chemical Imaging || Academics: Chemistry || Active || Detects atmospheric properties and weather ||
|| Metal Detector || Perception || Active ||   ||
|| Nanoscopic Vision || Perception || Passive || Sees nano-scale objects ||
|| Polarization Vision || Perception || Passive || Sees extra visual details ||
|| Pressure Sensor || Simple Perception || Passive || Detects any weight placed upon it. ||
|| RADAR || Perception || Active || Detects speed, movement, range, concealed objects, implants; works best on metallic objects; low resolution; sees through walls; reads pulse/respiration at close range ||
|| -Quantum RADAR || Perception || Active || Better resolution ||
|| Radiation sense || Perception || Passive || Detects radiation ||
|| Radio Motion Detection || Interfacing || Passive || Detects movement, speed, range based on existing wireless signals; poor resolution ||
|| Smell || Perception || Passive ||   ||
|| Standard Audio || Perception || Passive ||   ||
|| Standard Visual || Perception || Passive ||   ||
|| Terahertz (t-ray) || Perception || Active (Passive in space) || Mid-ground between RADAR and sight; penetrates walls but is blocked by skin, metal, and water. ||
|| Ultrasound || Perception || Active || Detects movement, density; low resolution imaging. ||
|| Ultraviolet || Perception || Passive || Some chemicals fluoresce under UV Light ||
|| X-ray/Gamma-ray || Perception || Active || Detects implants, skeletal biometrics, concealed weapons, radiation; penetrates walls and metal. ||

[ [[eclipse-phase/Home|Home]] | [[eclipse-phase/Game Rules|Game Rules]] ]